
    Degree Complexity of a Family of Birational Maps

    We compute the degree complexity of a family of birational mappings of the plane with high-order singularities.

    Statistical physics of low density parity check error correcting codes

    We study the performance of Low Density Parity Check (LDPC) error-correcting codes using the methods of statistical physics. LDPC codes are based on the generation of codewords as Boolean sums of the original message bits, employing two randomly constructed sparse matrices. These codes can be mapped onto Ising spin models and studied using common methods of statistical physics. We examine various regular constructions and obtain insight into their theoretical and practical limitations. We also briefly report on results obtained for irregular code constructions, for codes with a non-binary alphabet, and on how a finite system size affects the error probability.
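    As a purely illustrative sketch of the encoding step described in this abstract (not the authors' code), the following generates a codeword as a Boolean (mod-2) combination of message bits using two small, randomly constructed sparse binary matrices, in the spirit of MacKay-Neal/Gallager-type constructions; all sizes, sparsities, and helper names are assumptions.
```python
# Hypothetical toy encoder: codeword t = B^{-1} A s (mod 2), with A (N x K)
# and B (N x N) sparse binary matrices. Sizes and sparsity are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def sparse_binary(rows, cols, per_row):
    """Random binary matrix with `per_row` ones in each row."""
    M = np.zeros((rows, cols), dtype=np.uint8)
    for r in range(rows):
        M[r, rng.choice(cols, size=per_row, replace=False)] = 1
    return M

def gf2_inverse(B):
    """Invert a binary matrix over GF(2) by Gaussian elimination (None if singular)."""
    n = B.shape[0]
    aug = np.concatenate([B.copy() % 2, np.eye(n, dtype=np.uint8)], axis=1)
    for col in range(n):
        pivot = next((r for r in range(col, n) if aug[r, col]), None)
        if pivot is None:
            return None
        aug[[col, pivot]] = aug[[pivot, col]]
        for r in range(n):
            if r != col and aug[r, col]:
                aug[r] ^= aug[col]
    return aug[:, n:]

K, N = 4, 8                        # message and codeword lengths (toy values)
A = sparse_binary(N, K, per_row=2)
B, Binv = None, None
while Binv is None:                # redraw until B is invertible over GF(2)
    B = sparse_binary(N, N, per_row=3)
    Binv = gf2_inverse(B)

s = rng.integers(0, 2, size=K, dtype=np.uint8)   # message bits
t = (Binv @ (A @ s % 2)) % 2                     # codeword: Boolean sums of message bits
# Receiver-side parity check: B t = A s (mod 2)
assert np.array_equal((B @ t) % 2, (A @ s) % 2)
print("message:", s, "codeword:", t)
```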

    Critical Noise Levels for LDPC decoding

    We determine the critical noise level for decoding low-density parity-check error-correcting codes based on the magnetization enumerator (M), rather than on the weight enumerator (W) employed in the information theory literature. The interpretation of our method is appealingly simple, and the relation between different decoding schemes, such as typical-pairs decoding, MAP, and finite-temperature decoding (MPM), becomes clear. In addition, our analysis provides an explanation for the difference in performance between MN and Gallager codes. Our results are more optimistic than those derived via the methods of information theory and are in excellent agreement with recent results from another statistical physics approach. Comment: 9 pages, 5 figures
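    Critical noise levels in this line of work are usually quoted as flip probabilities of a binary symmetric channel and measured against the channel's Shannon limit. Purely as a reference point, and not as the paper's magnetization-enumerator calculation, a minimal sketch of that Shannon limit (the flip probability at which 1 - H2(p) equals the code rate R) might look like this.
```python
# Reference point only: Shannon limit of the binary symmetric channel,
# i.e. the largest flip probability p_c at which rate-R communication is
# still possible, found by solving R = 1 - H2(p) with bisection.
from math import log2

def H2(p):
    """Binary entropy in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

def shannon_limit(R, tol=1e-10):
    """Flip probability p_c with 1 - H2(p_c) = R (0 < R < 1), by bisection."""
    lo, hi = 1e-12, 0.5
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if 1 - H2(mid) > R:   # capacity still above the rate: noise can grow
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

for R in (0.25, 0.5, 0.75):
    print(f"rate R = {R}: Shannon-limit flip probability ~ {shannon_limit(R):.4f}")
```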

    Exponential lower bound on the highest fidelity achievable by quantum error-correcting codes

    On a class of memoryless quantum channels that includes the depolarizing channel, the highest fidelity of quantum error-correcting codes of length n and rate R is proven to be lower bounded by 1 - exp[-nE(R) + o(n)] for some function E(R). The function E(R) is positive below some threshold R', which implies that R' is a lower bound on the quantum capacity. Comment: Ver. 4: in versions 1-3, I claimed Theorem 1 for general quantum channels; in this paper I now claim it only for a slight generalization of the depolarizing channel, because Lemma 2 in versions 1-3 was wrong; the original general statement is proved in quant-ph/0112103. Ver. 5: text sectionalized. Appeared in PRA. The PRA article is typographically slightly crude: the LaTeX star symbol, used in superscripts, was capriciously replaced by the asterisk in several places after my proofreading.

    Tighter decoding reliability bound for Gallager's error-correcting code

    Statistical physics is employed to evaluate the finite-message-length performance of an ensemble of Gallager's error-correcting codes. We follow Gallager's approach of upper-bounding the average decoding error rate, but invoke the replica method to reproduce the tightest general bound to date, and to improve on the most accurate zero-error noise level threshold reported in the literature. The relation between the methods used and those presented in the information theory literature is explored.

    Statistical Mechanics of Broadcast Channels Using Low Density Parity Check Codes

    We investigate the use of Gallager's low-density parity-check (LDPC) codes in a broadcast channel, one of the fundamental models in network information theory. Combining linear codes is a standard technique in practical network communication schemes and is known to provide better performance than simple timesharing methods when algebraic codes are used. The statistical-physics-based analysis shows that the practical performance of the suggested method, achieved by employing the belief propagation algorithm, is superior to that of LDPC-based timesharing codes, while the best performance, when received transmissions are optimally decoded, is bounded by the timesharing limit. Comment: 14 pages, 4 figures
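    The timesharing limit mentioned in the abstract has a simple closed form when each receiver's link is modelled as a binary symmetric channel. The sketch below is an illustration under that modelling assumption, not the paper's statistical-physics analysis: it traces the rate pairs achievable by splitting channel uses between two users with flip rates p1 and p2.
```python
# Timesharing boundary for a two-user broadcast channel modelled as two
# binary symmetric channels: user 1 gets a fraction a of the channel uses
# at rate up to C1 = 1 - H2(p1), user 2 the remaining fraction at C2.
from math import log2

def H2(p):
    """Binary entropy in bits (0 at the endpoints)."""
    return -p * log2(p) - (1 - p) * log2(1 - p) if 0 < p < 1 else 0.0

def timesharing_boundary(p1, p2, steps=5):
    """Rate pairs (R1, R2) = (a*C1, (1-a)*C2) for a on a grid in [0, 1]."""
    C1, C2 = 1 - H2(p1), 1 - H2(p2)
    return [(a / steps * C1, (1 - a / steps) * C2) for a in range(steps + 1)]

for R1, R2 in timesharing_boundary(p1=0.05, p2=0.15):
    print(f"R1 = {R1:.3f}, R2 = {R2:.3f}")
```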

    Statistical mechanics of typical set decoding

    The performance of "typical set (pairs) decoding" for ensembles of Gallager's linear codes is investigated using statistical physics. In this decoding scheme, an error occurs either when the transmission is corrupted by an atypical noise, or when two or more typical noise sequences satisfy the parity-check equations provided by the received codeword. We show that the average error rate for the latter case over a given code ensemble can be tightly evaluated using the replica method, including its sensitivity to the message length. Our approach generally improves on the existing analysis known in the information theory community, which was reintroduced by MacKay (1999) and is believed to be the most accurate to date. Comment: 7 pages
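    To make the two error events concrete, here is a brute-force toy illustration (illustrative assumptions only, not the paper's replica-method evaluation): for a small random parity-check matrix it checks whether the actual noise falls outside the typical set, and otherwise counts how many typical-weight noise vectors reproduce the observed syndrome.
```python
# Toy typical-set (pairs) decoding check on a binary symmetric channel.
# Since H t = 0 for any codeword t, the syndrome of the received word
# equals H @ noise, so we compute it directly from the noise vector.
from itertools import combinations
import numpy as np

rng = np.random.default_rng(1)

N, M = 10, 5            # code length and number of parity checks (toy values)
p = 0.1                 # flip probability of the binary symmetric channel
H = rng.integers(0, 2, size=(M, N), dtype=np.uint8)   # random parity-check matrix

def typical_weights(N, p, delta=1):
    """Noise weights treated as 'typical': within delta of the mean N*p."""
    mean = N * p
    return {w for w in range(N + 1) if abs(w - mean) <= delta}

noise = (rng.random(N) < p).astype(np.uint8)
syndrome = (H @ noise) % 2

weights = typical_weights(N, p)
if noise.sum() not in weights:
    print("type-I error: the actual noise is atypical")
else:
    # Count typical-weight noise candidates consistent with the syndrome.
    candidates = 0
    for w in weights:
        for flips in combinations(range(N), w):
            v = np.zeros(N, dtype=np.uint8)
            v[list(flips)] = 1
            if np.array_equal((H @ v) % 2, syndrome):
                candidates += 1
    print("type-II error" if candidates > 1 else "unique typical solution",
          f"({candidates} typical candidate(s))")
```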